Backward Elimination Methods for Associative Memory Network Pruning

Authors

  • Xia Hong
  • Christopher J. Harris
  • Martin Brown
  • Sheng Chen
Abstract

Three hybrid data-based model construction/pruning formulae are introduced, using backward elimination as an automatic postprocessing approach to improve model sparsity. Each approach is based on a composite cost function combining the model fit with one of three terms (A-optimality, D-optimality, or the parameter 1-norm used in basis pursuit) that determines the pruning process. The A-/D-optimality-based pruning formulae involve an orthogonalisation between the pruned model and the deleted regressor. The basis-pursuit cost function yields a simple formula with no need for an orthogonalisation process. These different approaches to parsimonious data-based modelling are applied to the same numerical examples in parallel to demonstrate their computational effectiveness.
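The basis-pursuit variant lends itself to a compact illustration. The sketch below is a loose Python analogue, not the authors' actual formulae: it prunes a linear-in-the-parameters model by greedy backward elimination under a composite cost of squared fit error plus an l1 parameter penalty. The function names, the penalty weight `lam`, and the toy data are all illustrative assumptions.

```python
import numpy as np

def composite_cost(X, y, cols, lam):
    """Least-squares fit on the selected columns plus an l1 parameter penalty."""
    theta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
    residual = y - X[:, cols] @ theta
    return residual @ residual + lam * np.abs(theta).sum()

def backward_eliminate(X, y, lam=1.0):
    """Greedily delete the regressor whose removal most reduces the composite cost."""
    cols = list(range(X.shape[1]))
    while len(cols) > 1:
        current = composite_cost(X, y, cols, lam)
        trials = [(composite_cost(X, y, cols[:i] + cols[i + 1:], lam), i)
                  for i in range(len(cols))]
        best_cost, best_i = min(trials)
        if best_cost >= current:  # no deletion improves the cost: stop pruning
            break
        cols.pop(best_i)
    return cols

# Toy example: y depends on only the first two of five candidate regressors.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.01 * rng.standard_normal(200)
print(backward_eliminate(X, y))
```

Under these assumptions the greedy loop keeps the informative regressors (whose deletion would sharply increase the fit error) while pruning near-zero-coefficient terms. The A-/D-optimality variants described in the abstract would instead combine the fit with an experiment-design measure of the information matrix (e.g. its trace or log-determinant), which is why they require the orthogonalisation step that the l1 variant avoids.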


Related articles

Computational Aspects of Synaptic Elimination

Research in humans and primates shows that the developmental course of the brain involves synaptic over-growth followed by marked selective pruning which eliminates about half of the synapses of the child. Previous explanations have suggested that this intriguing, seemingly wasteful, phenomenon is utilized to remove 'erroneous' synapses which were studied at an early stage. This thesis proves t...


An Autoassociative Neural Network Model of Paired-Associate Learning

Hebbian heteroassociative learning is inherently asymmetric. Storing a forward association, from item A to item B, enables recall of B (given A), but does not permit recall of A (given B). Recurrent networks can solve this problem by associating A to B and B back to A. In these recurrent networks, the forward and backward associations can be differentially weighted to account for asymmetries in...


Evolved Asymmetry and Dilution of Random Synaptic Weights in Hopfield Network Turn a Spin-glass Phase into Associative Memory

We apply evolutionary computations to Hopfield's neural network model of associative memory. Previously, we reported that a genetic algorithm can enlarge the Hebb rule associative memory by pruning some of the over-loaded Hebbian synaptic weights. In this paper, we show that the genetic algorithm also evolves random synaptic weights to store some number of patterns.


Incremental Learning of Limited Kernel Associative Memory

This paper proposes a limited kernel associative memory, where the number of kernels is limited to a certain number. This model aims to be used on embedded systems with a small amount of storage space. The learning algorithm for the kernel associative memory is an improved version of the limited general regression neural network, which was proposed by one of the authors. In the experiments, we ...


Sequence Learning and Planning on Associative Spiking Neural Network

We have been building an auto/heteroassociative spiking neural network combined with a working memory model. In this model, a state-driven forward sequence and a goal-driven backward sequence on the associative network are respectively represented by a sequence of synchronous firing in a particular gamma subcycle during a theta oscillation. These forward and backward sequence firings are transmi...



Journal:
  • Int. J. Hybrid Intell. Syst.

Volume 1, Issue 

Pages  -

Publication date: 2004